

Appendix B: Diffusion Process as ODE

Neural Information Processing Systems

In this section, we show that Cold Sampling is an approximation of the Euler method for (5); the intuition is as follows. B.2 Why is cold sampling better than naive sampling? Naive sampling does not have this property. The proof relies on applying the definition of Lipschitz functions twice.
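The excerpt contrasts cold sampling with naive sampling. A minimal numerical sketch of that contrast is below, assuming a simple linear degradation operator and an exact restoration network; `degrade`, `restore`, and the step schedule are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def degrade(x, s):
    # Hypothetical linear degradation operator T_s (an assumption for
    # illustration): blend the sample toward zero as the step s grows.
    alpha = 1.0 - s / 10.0
    return alpha * x

def restore(x_s, s):
    # Hypothetical restoration network D(x_s, s); here it inverts the
    # linear degradation exactly so the example stays self-contained.
    alpha = 1.0 - s / 10.0
    return x_s / alpha

def naive_step(x_s, s):
    # Naive sampling: fully restore, then re-degrade to step s - 1.
    x0_hat = restore(x_s, s)
    return degrade(x0_hat, s - 1)

def cold_step(x_s, s):
    # Cold sampling: x_{s-1} = x_s - T_s(D(x_s, s)) + T_{s-1}(D(x_s, s)),
    # an Euler-like first-order correction of the naive update.
    x0_hat = restore(x_s, s)
    return x_s - degrade(x0_hat, s) + degrade(x0_hat, s - 1)
```

With an exact restoration operator the two updates coincide; the cold update's advantage appears when restoration is only approximate, since its error terms partially cancel.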



Supplementary Materials - Adaptive Online Replanning with Diffusion Models

Siyuan Zhou

Neural Information Processing Systems

In the supplementary, we first discuss the experimental details and hyperparameters in Section A and Section B, and further present the visualization in RLBench in Section C. Finally, we discuss … an MLP with 512 hidden units and Mish activations. The probability ϵ of random actions is set to 0.03 in stochastic environments, so the sampled trajectories can still lead to collisions. Figure 1 illustrates a problematic sampled trajectory after execution. We further evaluate the performance with different replanning steps in Table 1.
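The excerpt mentions an MLP with 512 hidden units and Mish activations. A minimal sketch of that building block, assuming only the standard definition mish(x) = x · tanh(softplus(x)) (the layer helper is illustrative, not the paper's architecture):

```python
import numpy as np

def mish(x):
    # Mish activation: x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x).
    # log1p(exp(x)) gives a reasonably stable softplus for moderate inputs.
    return x * np.tanh(np.log1p(np.exp(x)))

def mlp_hidden_layer(x, w, b):
    # One hidden layer of the kind described in the excerpt: a linear map
    # followed by Mish. With 512 hidden units, w has shape (input_dim, 512)
    # and b has shape (512,).
    return mish(x @ w + b)
```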







Categorical Reparameterization with Denoising Diffusion Models

Gourevitch, Samson, Durmus, Alain, Moulines, Eric, Olsson, Jimmy, Janati, Yazid

arXiv.org Machine Learning

Gradient-based optimization with categorical variables typically relies on score-function estimators, which are unbiased but noisy, or on continuous relaxations that replace the discrete distribution with a smooth surrogate admitting a pathwise (reparameterized) gradient, at the cost of optimizing a biased, temperature-dependent objective. In this paper, we extend this family of relaxations by introducing a diffusion-based soft reparameterization for categorical distributions. For these distributions, the denoiser under a Gaussian noising process admits a closed form and can be computed efficiently, yielding a training-free diffusion sampler through which we can backpropagate. Our experiments show that the proposed reparameterization trick yields competitive or improved optimization performance on various benchmarks.
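As a point of reference for the relaxation baseline the abstract contrasts against, here is a minimal sketch of the standard Gumbel-Softmax (Concrete) reparameterization; this is not the paper's diffusion-based sampler, only the temperature-dependent smooth surrogate that family of methods starts from:

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    # Draw Gumbel(0, 1) noise via the inverse-CDF trick, then relax the
    # argmax into a softmax with temperature tau. As tau -> 0 the sample
    # approaches one-hot, at the cost of noisier gradients; larger tau
    # gives smoother but more biased surrogates.
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = y - y.max()          # shift for numerical stability
    e = np.exp(y)
    return e / e.sum()
```

Because the sample is a deterministic, differentiable function of the logits given the noise, gradients can flow through it (the pathwise estimator), which is the property the diffusion-based reparameterization above also aims to provide.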


One-Step Effective Diffusion Network for Real-World Image Super-Resolution

Neural Information Processing Systems

The pre-trained text-to-image diffusion models have been increasingly employed to tackle the real-world image super-resolution (Real-ISR) problem due to their powerful generative image priors. Most of the existing methods start from random noise to reconstruct the high-quality (HQ) image under the guidance of the given low-quality (LQ) image. While promising results have been achieved, such Real-ISR methods require multiple diffusion steps to reproduce the HQ image, increasing the computational cost. Meanwhile, the random noise introduces uncertainty in the output, which is unfriendly to image restoration tasks. To address these issues, we propose a one-step effective diffusion network, namely OSEDiff, for the Real-ISR problem.